1.
Dent J (Basel) ; 12(4)2024 Apr 03.
Article in English | MEDLINE | ID: mdl-38668007

ABSTRACT

The dental pulp chamber volume is a fundamental measurement in endodontics, but also in the forensic sciences, in teaching and training, and in tissue engineering. This study evaluates the precision of cone-beam computed tomography (CBCT) against computed micro-tomography (micro-CT) for measuring the pulp chamber volume of upper central incisors ex vivo. Intra-operator and inter-operator errors were evaluated, and the results of the two techniques were compared using a t-test for paired samples. The p-values for the intra-operator and inter-operator comparisons were >0.05, indicating adequate reproducibility for each operator and no significant differences between their measurements. Likewise, no significant differences were found between the two measurement techniques. The present results demonstrate that CBCT is a precise, feasible, and reproducible technique for evaluating the dental pulp chamber volume ex vivo. The results provided by this method are useful in several medical domains, as well as in the teaching and training of undergraduate and postgraduate students. Furthermore, the findings carry significant clinical implications, as accurate assessment of the pulp chamber volume is critical in the diagnosis and treatment of various endodontic conditions. The ability of CBCT to provide reliable 3D dental anatomy measurements can enhance the planning of endodontic treatments by allowing a better understanding of internal tooth morphology. Additionally, the precision and reproducibility of CBCT in assessing the pulp chamber volume can contribute to improved clinical outcomes and fewer complications during endodontic procedures. These findings further support the increasingly vital role of CBCT in modern clinical practice and underscore its value as an indispensable tool in dentistry.
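As a concrete illustration of the paired-samples t-test mentioned above, the following sketch computes the t statistic for two hypothetical series of volume measurements on the same teeth. The numbers are illustrative only, not data from the study:

```python
import math

def paired_t_statistic(a, b):
    """t statistic for paired samples: mean difference divided by the
    standard error of the differences."""
    assert len(a) == len(b) and len(a) > 1
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical pulp chamber volumes (mm^3) for the same six teeth,
# measured once with each technique -- illustrative numbers only.
cbct     = [32.1, 28.4, 35.0, 30.2, 29.8, 33.5]
micro_ct = [31.8, 28.9, 34.6, 30.5, 29.4, 33.9]
t = paired_t_statistic(cbct, micro_ct)
# A |t| below the critical value (about 2.57 for 5 d.f. at alpha = 0.05)
# would indicate no significant difference between the techniques.
```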

2.
Comput Math Methods Med ; 2021: 5556433, 2021.
Article in English | MEDLINE | ID: mdl-34422090

ABSTRACT

The prediction of the dynamics of the COVID-19 outbreak and of the corresponding needs of the health care system (COVID-19 patient admissions, the number of critically ill patients, the need for intensive care units, etc.) is based on the combination of a limited-growth model (the Verhulst model) and a short-term predictive model that produces forecasts for the following day. In both cases, an uncertainty analysis of the prediction is performed, i.e., we determine the set of equivalent models that fit the historical data with the same accuracy. This set of models provides the posterior distribution of the parameters of the predictive model that fits the historical series. It can be extrapolated to the same analyzed time series (e.g., the number of infected individuals per day) or to another correlated time series of interest, and used, e.g., to predict the number of patients admitted to urgent care units, the number of critically ill patients, or the total number of admissions, all of which are directly related to health needs. These models can be regionalized, that is, predictions can be made at the local level if the data are disaggregated. We show that the Verhulst and Gompertz models provide similar results and can also be used to monitor and predict new outbreaks. However, the Verhulst model seems easier to interpret and use.
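The Verhulst (logistic) growth curve underlying these predictions can be sketched as below. The fitting routine is a deliberately crude grid search over the carrying capacity K and growth rate r, standing in for whatever calibration procedure the authors actually used:

```python
import math

def verhulst(t, K, r, n0):
    """Verhulst (logistic) growth curve: cumulative cases at time t,
    with carrying capacity K, growth rate r and initial value n0."""
    return K / (1.0 + (K - n0) / n0 * math.exp(-r * t))

def fit_by_grid(days, cases, K_grid, r_grid, n0):
    """Crude grid search over (K, r): return the pair minimising the
    sum of squared residuals against the observed series."""
    best, best_err = None, float("inf")
    for K in K_grid:
        for r in r_grid:
            err = sum((verhulst(t, K, r, n0) - c) ** 2
                      for t, c in zip(days, cases))
            if err < best_err:
                best, best_err = (K, r), err
    return best

# Recover known parameters from a synthetic outbreak curve.
days = list(range(30))
observed = [verhulst(t, K=1000, r=0.3, n0=10) for t in days]
K_hat, r_hat = fit_by_grid(days, observed,
                           [800, 900, 1000, 1100],
                           [0.1, 0.2, 0.3, 0.4], n0=10)
```

In practice one would fit the model to the daily case series of each region, which is what makes the regionalized predictions possible.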


Subject(s)
COVID-19/epidemiology , Models, Biological , Pandemics , SARS-CoV-2 , COVID-19/transmission , Computational Biology , Health Services Needs and Demand , Humans , Mathematical Concepts , Models, Statistical , Pandemics/statistics & numerical data , Spain/epidemiology , Time Factors
3.
BMC Bioinformatics ; 21(Suppl 2): 89, 2020 Mar 11.
Article in English | MEDLINE | ID: mdl-32164540

ABSTRACT

BACKGROUND: Phenotype prediction problems are usually considered ill-posed, as the number of samples is very limited with respect to the number of genetic probes scrutinized. This complicates the sampling of the defective genetic pathways, given the high number of possible discriminatory genetic networks involved. In this research, we outline three novel sampling algorithms used to identify, classify and characterize the defective pathways in phenotype prediction problems, namely the Fisher's ratio sampler, the Holdout sampler and the Random sampler, and apply each one to the analysis of genetic pathways involved in tumor behavior and outcomes of triple negative breast cancers (TNBC). Altered biological pathways are identified using the most frequently sampled genes and are compared to those obtained via Bayesian networks (BNs). RESULTS: The Random, Fisher's ratio and Holdout samplers were more accurate and robust than BNs, while providing comparable insights into disease genomics. CONCLUSIONS: The three samplers tested are good alternatives to Bayesian networks, since they are computationally less demanding. Importantly, this analysis confirms the concept of "biological invariance", since the altered pathways should be independent of the sampling methodology and of the classifier used for their inference. Nevertheless, Bayesian networks still need some modifications to sample the uncertainty space correctly in phenotype prediction problems, since the probabilistic parameterization of the uncertainty space is not unique and the use of the optimum network might distort the pathway analysis.
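A minimal sketch of the Holdout sampler idea, combined with a Fisher's-ratio gene ranking, might look as follows. Function names, the split fraction and the scoring formula are illustrative assumptions, not the paper's implementation:

```python
import random

def fishers_ratio_rank(samples, labels):
    """Rank genes by Fisher's ratio (m1 - m0)^2 / (v1 + v0)."""
    n_genes = len(samples[0])
    def ratio(g):
        a = [s[g] for s, l in zip(samples, labels) if l == 1]
        b = [s[g] for s, l in zip(samples, labels) if l == 0]
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        va = sum((x - ma) ** 2 for x in a) / len(a)
        vb = sum((x - mb) ** 2 for x in b) / len(b)
        return (ma - mb) ** 2 / (va + vb + 1e-12)  # eps avoids 0/0
    return sorted(range(n_genes), key=ratio, reverse=True)

def holdout_sampler(samples, labels, rank_genes, n_splits=200,
                    holdout=0.25, top_k=10, seed=0):
    """Repeatedly split the cohort, rank genes on the training fold,
    and count how often each gene lands in the top-k; the most
    frequently sampled genes point to candidate defective pathways."""
    rng = random.Random(seed)
    counts = {}
    idx = list(range(len(samples)))
    n_hold = max(1, int(holdout * len(idx)))
    for _ in range(n_splits):
        rng.shuffle(idx)
        train = idx[n_hold:]
        ranked = rank_genes([samples[i] for i in train],
                            [labels[i] for i in train])
        for g in ranked[:top_k]:
            counts[g] = counts.get(g, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)
```

The biological-invariance argument is that the genes surviving this frequency count should be largely the same regardless of which of the three samplers produced them.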


Subject(s)
Algorithms , Triple Negative Breast Neoplasms/pathology , Bayes Theorem , Databases, Genetic , Female , Gene Expression Regulation, Neoplastic , Gene Regulatory Networks , Humans , Neoplasm Metastasis , Phenotype , Survival Analysis , Triple Negative Breast Neoplasms/genetics , Triple Negative Breast Neoplasms/mortality
4.
Mech Ageing Dev ; 182: 111129, 2019 09.
Article in English | MEDLINE | ID: mdl-31445068

ABSTRACT

Sarcopenia is an age-related multifactorial process that involves several biological mechanisms whose specific contributions and interplay are still unknown. The present study proposes prognostic networks based on machine learning approaches to unravel the interplay among the biological mechanisms mainly involved in the development of sarcopenia. After analyzing 114 biological and clinical variables in adults older than 70 years, and using all the biological prognostic networks detected by machine learning with an accuracy higher than 82%, we designed a consensus classifier based on majority voting that improves the predictive accuracy for sarcopenia up to 91%. Additionally, we applied logistic regression analysis to propose the interplay among the most discriminative biological variables of sarcopenia: anthropometry, body composition, functional performance of the lower limbs, systemic oxidative stress, presence of depression, and medication for the digestive system based on proton-pump inhibitors. Our data also demonstrate that, besides loss of muscle mass, impairments in the functional performance of the lower limbs are more relevant to developing sarcopenia than impairments in muscle strength.
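The majority-vote consensus step can be sketched as below. The individual rules and their cut-offs are hypothetical placeholders for the machine-learning classifiers actually combined in the study:

```python
def consensus_predict(classifiers, subject):
    """Majority vote over a pool of binary classifiers:
    flag the subject (return 1) when more than half vote positive."""
    votes = sum(1 for clf in classifiers if clf(subject))
    return 1 if 2 * votes > len(classifiers) else 0

# Hypothetical single-variable rules; the variables and cut-offs below
# are illustrative placeholders, NOT the study's thresholds.
rules = [
    lambda s: s["gait_speed"] < 0.8,         # slow gait (m/s)
    lambda s: s["muscle_mass_index"] < 7.0,  # low appendicular muscle mass
    lambda s: s["chair_rise_time"] > 15.0,   # slow five-rise test (s)
]
subject = {"gait_speed": 0.6, "muscle_mass_index": 7.4,
           "chair_rise_time": 18.0}
risk = consensus_predict(rules, subject)  # 2 of 3 rules fire -> 1
```

The appeal of the design is that the consensus can outperform every individual member as long as their errors are not perfectly correlated.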


Subject(s)
Machine Learning , Sarcopenia , Aged , Aged, 80 and over , Female , Humans , Male , Prognosis , Sarcopenia/diagnosis , Sarcopenia/metabolism , Sarcopenia/pathology
5.
Histopathology ; 75(6): 916-930, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31342542

ABSTRACT

AIMS: It is known that matrix metalloproteinase (MMP)-11 plays a role in tumour development and progression, and that immune cells can influence cancer cells to increase their proliferative and invasive properties. The aim of the present study was to evaluate MMP11 expression by intratumoral mononuclear inflammatory cells (MICs) as a useful biological marker for breast cancer prognosis. METHODS AND RESULTS: This study comprised 246 women with invasive breast carcinoma and a long follow-up period. Patients were stratified with regard to nodal status and to the development of metastatic disease. The median follow-up period was 146 months in patients without metastasis and 31 months in patients with metastatic disease. MMP11 was determined by immunohistochemistry. For the relapse-free survival (RFS) and overall survival (OS) analyses, we used Cox's univariate method; Cox's regression model was used to examine the interactions between different prognostic factors in a multivariate analysis. CONCLUSIONS: Our results showed that MMP11 expression by stromal cells was significantly associated with prognosis. MMP11 expression by cancer-associated fibroblasts (CAFs) was associated with both shortened RFS and OS, but MMP11 expression by MICs showed a stronger association with both, making it the most potent independent factor for predicting RFS and OS.
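The RFS/OS analyses rest on standard survival estimators. As background, a minimal Kaplan-Meier estimator (not the study's code, and run here on toy data) can be written as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. events[i] is 1 for an observed
    event (relapse/death), 0 for censoring. Returns (time, S(t))
    pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0
    at_risk = n
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d = c = 0  # events and total departures at time t
        while i < n and data[i][0] == t:
            d += data[i][1]
            c += 1
            i += 1
        if d:
            s *= (at_risk - d) / at_risk
            curve.append((t, s))
        at_risk -= c
    return curve

times  = [1, 2, 3, 4, 5]   # months to event/censoring (toy data)
events = [1, 1, 0, 1, 1]   # 1 = relapse/death, 0 = censored
curve = kaplan_meier(times, events)
```

Cox's model then relates such survival curves to covariates (here, MMP11 expression by CAFs or MICs) via proportional hazards.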


Subject(s)
Breast Neoplasms/diagnosis , Gene Expression Regulation, Neoplastic , Matrix Metalloproteinase 11/metabolism , Breast/pathology , Breast Neoplasms/pathology , Cancer-Associated Fibroblasts/pathology , Disease-Free Survival , Female , Humans , Immunohistochemistry , Inflammation/pathology , Kaplan-Meier Estimate , Middle Aged , Multivariate Analysis , Neoplasm Metastasis , Prognosis , Stromal Cells/pathology
6.
J Mol Model ; 25(3): 79, 2019 Feb 27.
Article in English | MEDLINE | ID: mdl-30810816

ABSTRACT

We discuss the relationship between the problem of protein tertiary structure prediction from the amino acid sequence and uncertainty analysis. The algorithm presented in this paper belongs to the category of decoy-based modeling, where different known protein models are used to establish a low-dimensional space via principal component analysis. This low-dimensional space is then used to perform an energy optimization, via a family of very explorative particle swarm optimizers, to find the global minimum. The aim of this procedure is to obtain a representative sample of the nonlinear equivalence region, that is, of the protein models whose energy is lower than a certain energy bound. The posterior analysis of this family provides very valuable information about the backbone structure of the native conformation and its possible alternate states. This methodology has the advantage of being simple and fast and can help refine the tertiary protein structure. We comprehensively illustrate the performance of our algorithm on one protein from the CASP-9 protein structure prediction experiment. We also provide a theoretical analysis of the energy landscape encountered in the tertiary protein structure inverse problem, explaining why model reduction techniques (principal component analysis in this case) serve to alleviate the ill-posed character of this high-dimensional optimization problem. In addition, we expand the computational benchmark with a summary of other CASP-9 proteins in the Appendix.
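The model-reduction step can be illustrated with a bare-bones power iteration that extracts the leading principal component of a set of decoys. This is a sketch under the simplifying assumption of a single component; the paper's PCA space uses several:

```python
import math

def leading_pc(decoys, iters=200):
    """Power iteration for the first principal component of a set of
    decoy conformations (each decoy is a flat coordinate list)."""
    n, d = len(decoys), len(decoys[0])
    mean = [sum(x[j] for x in decoys) / n for j in range(d)]
    X = [[x[j] - mean[j] for j in range(d)] for x in decoys]
    v = [1.0] * d
    for _ in range(iters):
        # w = (X^T X) v, computed without forming the covariance matrix
        Xv = [sum(row[j] * v[j] for j in range(d)) for row in X]
        w = [sum(X[i][j] * Xv[i] for i in range(n)) for j in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return mean, v

def project(x, mean, v):
    """Coordinate of a conformation along the principal direction;
    the energy optimization then searches over such coordinates
    instead of the full atomic coordinate space."""
    return sum((xi - mi) * vi for xi, mi, vi in zip(x, mean, v))
```

Searching over a handful of such coordinates, rather than thousands of atomic coordinates, is what alleviates the ill-posedness of the optimization.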


Subject(s)
Caspase 9/chemistry , Computational Biology/methods , Algorithms , Amino Acid Sequence , Computer Simulation , Models, Molecular , Principal Component Analysis , Protein Folding , Protein Structure, Tertiary , Proteins/chemistry , Uncertainty
7.
J Bioinform Comput Biol ; 16(2): 1850005, 2018 04.
Article in English | MEDLINE | ID: mdl-29566640

ABSTRACT

We discuss the applicability of principal component analysis (PCA) to protein tertiary structure prediction from the amino acid sequence. The algorithm presented in this paper belongs to the category of protein refinement models and involves establishing a low-dimensional space in which the sampling (and optimization) is carried out via a particle swarm optimizer (PSO). The reduced space is found via PCA performed on a set of low-energy protein models previously found using different optimization techniques. A high-frequency term is added to this expansion by projecting the best decoy onto the PCA basis set and calculating the residual model; this term is aimed at providing high-frequency details in the energy optimization. The goal of this research is to analyze how the dimensionality reduction affects the prediction capability of the PSO procedure. For that purpose, different proteins from the Critical Assessment of Techniques for Protein Structure Prediction experiments were modeled. In all cases, both the energy of the best decoy and the distance to the native structure decreased. Our analysis also shows how the predicted backbone structure of the native conformation and of alternative low-energy states varies with the PCA dimensionality. Generally speaking, the reconstruction can be successfully achieved with 10 principal components plus the high-frequency term. We also provide a computational analysis of the protein energy landscape for the inverse problem of reconstructing the structure from a reduced number of principal components, showing that the dimensionality reduction alleviates the ill-posed character of this high-dimensional energy optimization problem. The procedure explained in this paper is very fast and allows testing different PCA expansions. Our results show that PSO improves the energy of the best decoy used in the PCA when an adequate number of PCA terms is considered.
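A minimal particle swarm optimizer over such a reduced space might look as follows. The inertia and acceleration constants are generic textbook values, not those used in the paper, and the objective here is a toy stand-in for a protein energy model:

```python
import random

def pso_minimise(f, dim, bounds, n_particles=30, iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO: f is the objective (e.g. an energy model evaluated
    on PCA coefficients), dim the number of reduced coordinates."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][j] = (w * vel[i][j]
                             + c1 * r1 * (pbest[i][j] - pos[i][j])
                             + c2 * r2 * (gbest[j] - pos[i][j]))
                pos[i][j] += vel[i][j]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

A "very explorative" variant, as described in the companion work, would tune these constants toward larger velocities to sample the low-energy region rather than collapse onto a single minimum.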


Subject(s)
Computational Biology/methods , Models, Molecular , Principal Component Analysis , Protein Structure, Tertiary , Proteins/chemistry , Proteins/metabolism , Uracil-DNA Glycosidase/chemistry , Uracil-DNA Glycosidase/metabolism
8.
Materials (Basel) ; 9(7)2016 Jun 29.
Article in English | MEDLINE | ID: mdl-28773653

ABSTRACT

The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut-and-fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut-and-fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining the rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machines and extreme learning machines). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to an existing database, improving the prediction output with data that reflect the local conditions of each mine.
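As a toy stand-in for the SVM and ELM classifiers, the sketch below learns a linear stability boundary in the (rock mass rating, span) plane from a handful of hypothetical case histories. All numbers are illustrative; a perceptron is used here only because it is the simplest linear classifier, not because the paper uses one:

```python
def train_perceptron(points, labels, epochs=1000, lr=0.1):
    """Perceptron learning of a linear boundary w.x + b = 0;
    labels are +1 (stable) / -1 (unstable)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def classify(point, w, b):
    """+1 = stable, -1 = unstable (hypothetical encoding)."""
    return 1 if w[0] * point[0] + w[1] * point[1] + b > 0 else -1

# Hypothetical case histories: (rock mass rating, span in metres).
cases  = [(80, 3.0), (72, 4.5), (65, 6.0),
          (40, 10.0), (32, 13.0), (25, 16.0)]
labels = [1, 1, 1, -1, -1, -1]  # stable vs unstable
scaled = [(r / 100.0, s / 10.0) for r, s in cases]  # comparable ranges
w, b = train_perceptron(scaled, labels)
```

Retraining on an enlarged case database shifts the learned boundary, which mirrors how the span graph can be updated as new field observations for a given mine are added.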
